quantum mutual information

In quantum information theory, quantum mutual information, or von Neumann mutual information, after John von Neumann, is a measure of correlation between the subsystems of a quantum state. It is the quantum mechanical analog of the Shannon mutual information.
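Concretely, for a bipartite state \rho^{AB} with reduced states \rho^{A} and \rho^{B}, the quantum mutual information is commonly defined, in direct analogy with the classical expression derived below, by replacing Shannon entropies with von Neumann entropies ''S'':
:\; I(\rho^{AB}) := S(\rho^{A}) + S(\rho^{B}) - S(\rho^{AB}).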
== Motivation ==

For simplicity, it will be assumed that all objects in the article are finite-dimensional.
The definition of quantum mutual entropy is motivated by the classical case. For a probability distribution of two variables ''p''(''x'', ''y''), the two marginal distributions are
:p(x) = \sum_y p(x,y)\; , \; p(y) = \sum_x p(x,y).
The classical mutual information ''I''(''X'', ''Y'') is defined by
:\;I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y))
where ''S''(''q'') denotes the Shannon entropy of the probability distribution ''q''.
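As a concrete check of this formula, the following minimal Python sketch (illustrative only; it assumes NumPy, uses natural logarithms, and the function names are chosen here rather than taken from any library) computes ''I''(''X'', ''Y'') for a small joint distribution by evaluating the three Shannon entropies directly.

import numpy as np

def shannon_entropy(q):
    # Shannon entropy S(q) = -sum_i q_i log q_i in nats; terms with q_i = 0 contribute 0
    q = np.asarray(q, dtype=float).ravel()
    q = q[q > 0]
    return -np.sum(q * np.log(q))

def mutual_information(p_xy):
    # I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y)) for a joint distribution p_xy[x, y]
    p_x = p_xy.sum(axis=1)   # marginal p(x) = sum_y p(x, y)
    p_y = p_xy.sum(axis=0)   # marginal p(y) = sum_x p(x, y)
    return shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy)

# Two perfectly correlated fair bits: I(X,Y) = log 2 (about 0.693 nats)
p_corr = np.array([[0.5, 0.0],
                   [0.0, 0.5]])
print(mutual_information(p_corr))

For this perfectly correlated pair of bits the result is log 2, the largest value attainable for two binary variables.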
One can calculate directly
:\; S(p(x)) + S(p(y))
:\; = -(\sum_x p(x) \log p(x) + \sum_y p(y) \log p(y))
:\; = -(\sum_x \; ( \sum_{y'} p(x,y') \log \sum_{y'} p(x,y') ) + \sum_y ( \sum_{x'} p(x',y) \log \sum_{x'} p(x',y) ))
:\; = -\sum_{x,y} p(x,y) (\log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y))
:\; = -\sum_{x,y} p(x,y) \log p(x) p(y) .
So the mutual information is
:I(X,Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}.
But this is precisely the relative entropy between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''). In other words, if we assume the two variables ''x'' and ''y'' to be uncorrelated, mutual information is the ''discrepancy in uncertainty'' resulting from this (possibly erroneous) assumption.
It follows from the non-negativity of the relative entropy that ''I''(''X'', ''Y'') ≥ 0, with equality if and only if ''p''(''x'', ''y'') = ''p''(''x'')''p''(''y'').
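To see the two expressions agree numerically, the short sketch below (again purely illustrative; it assumes NumPy, natural logarithms, and function names chosen here for clarity) evaluates ''I''(''X'', ''Y'') both via the entropy formula above and as the relative entropy between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''), and checks that a product distribution gives zero.

import numpy as np

def shannon_entropy(q):
    q = np.asarray(q, dtype=float).ravel()
    q = q[q > 0]
    return -np.sum(q * np.log(q))

def relative_entropy(p, q):
    # Kullback-Leibler divergence D(p || q) = sum_i p_i log(p_i / q_i), in nats
    p = np.asarray(p, dtype=float).ravel()
    q = np.asarray(q, dtype=float).ravel()
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# A correlated joint distribution p(x, y)
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

via_entropies  = shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy)
via_divergence = relative_entropy(p_xy, np.outer(p_x, p_y))
print(np.isclose(via_entropies, via_divergence))   # True: the two expressions agree

# A product distribution p(x, y) = p(x) p(y) has zero mutual information
p_prod = np.outer(p_x, p_y)
q_x, q_y = p_prod.sum(axis=1), p_prod.sum(axis=0)
print(np.isclose(relative_entropy(p_prod, np.outer(q_x, q_y)), 0.0))   # True

The zero value for the product distribution is exactly the equality case noted above.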
